Analytical and numerical study of internal representations in multilayer neural networks with binary weights.

Authors

  • Cocco
  • Monasson
  • Zecchina
Abstract

We study the weight-space structure of the parity machine with binary weights by deriving the distribution of volumes associated with the internal representations of the learning examples. The learning behaviour and the symmetry-breaking transition are analyzed, and the results are found to be in very good agreement with extended numerical simulations. PACS Numbers: 05.20, 64.60, 87.10
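The parity machine studied here outputs the parity (product of signs) of its hidden units. As context for the abstract, a minimal sketch of a tree parity machine with binary weights is shown below; the tree (non-overlapping receptive fields) architecture and the function names are assumptions for illustration, not the authors' code.

```python
import numpy as np

def parity_machine(x, W):
    """Output of a tree parity machine with binary weights.

    x : input vector of length K*n (entries in {-1, +1}),
        split into K non-overlapping blocks of n components.
    W : (K, n) array of binary weights in {-1, +1}.
    """
    K, n = W.shape
    blocks = x.reshape(K, n)
    # Internal representation: the sign of each hidden unit's local field
    # (with odd n and +/-1 entries, the field is never zero).
    tau = np.sign(np.sum(W * blocks, axis=1))
    # The output is the parity, i.e. the product of the hidden-unit signs.
    return int(np.prod(tau))

rng = np.random.default_rng(0)
K, n = 3, 5
W = rng.choice([-1, 1], size=(K, n))
x = rng.choice([-1, 1], size=K * n)
print(parity_machine(x, W))  # +1 or -1
```

The internal representation is the vector `tau` of hidden-unit signs; many weight assignments can realize the same `tau`, which is why the paper studies the distribution of volumes associated with each representation.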


Similar Resources

Learning by Choice of Internal Representations

We introduce a learning algorithm for multilayer neural networks composed of binary linear threshold elements. Whereas existing algorithms reduce the learning process to minimizing a cost function over the weights, our method treats the internal representations as the fundamental entities to be determined. Once a correct set of internal representations is arrived at, the weights are found by th...


The High-Dimensional Geometry of Binary Neural Networks

Recent research has shown that one can train a neural network with binary weights and activations at train time by augmenting the weights with a high-precision continuous latent variable that accumulates small changes from stochastic gradient descent. However, there is a dearth of work to explain why one can effectively capture the features in data with binary weights and activations. Our main ...


Analysis of the Internal Representations in Neural Networks for Machine Intelligence

The internal representation of the training patterns of multi-layer perceptrons was examined, and we demonstrated that the connection weights between layers are effectively transforming the representation format of the information from one layer to another in a meaningful way. The internal code, which can be in analog or binary form, is found to be dependent on a number of factors, including ...


Expectation Backpropagation: Parameter-Free Training of Multilayer Neural Networks with Continuous or Discrete Weights

Multilayer Neural Networks (MNNs) are commonly trained using gradient descent-based methods, such as BackPropagation (BP). Inference in probabilistic graphical models is often done using variational Bayes methods, such as Expectation Propagation (EP). We show how an EP based approach can also be used to train deterministic MNNs. Specifically, we approximate the posterior of the weights given th...


Neural Network Performance Analysis for Real Time Hand Gesture Tracking Based on Hu Moment and Hybrid Features

This paper presents a comparison study between the multilayer perceptron (MLP) and radial basis function (RBF) neural networks with supervised learning and the back-propagation algorithm to track hand gestures. Both networks have two output classes, hand and face. Skin is detected by a region-based algorithm in the image, and then the networks are applied on video sequences frame by frame in...



Journal:
  • Physical review. E, Statistical physics, plasmas, fluids, and related interdisciplinary topics

Volume 54, Issue 1

Pages: -

Publication date: 1996